Convergence of Conditional Metropolis-Hastings Samplers

Authors

  • Galin L. Jones
  • Gareth O. Roberts
  • Jeffrey S. Rosenthal
Abstract

We consider Markov chain Monte Carlo algorithms which combine Gibbs updates with Metropolis-Hastings updates, resulting in a conditional Metropolis-Hastings sampler (CMH). We develop conditions under which the CMH will be geometrically or uniformly ergodic. We illustrate our results by analysing a CMH used for drawing Bayesian inferences about the entire sample path of a diffusion process, based only upon discrete observations.
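The structure described above — alternating an exact Gibbs draw from one full conditional with a Metropolis-Hastings step targeting the other — can be sketched on a toy target. The bivariate normal below (with correlation `rho`) is a hypothetical illustrative choice, not the paper's diffusion setting; the function names are likewise invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.5  # correlation of the standard bivariate normal target (illustrative)

def gibbs_update_x(y):
    # Gibbs component: exact draw from the full conditional pi(x | y).
    return rng.normal(rho * y, np.sqrt(1.0 - rho**2))

def log_cond_y(y, x):
    # log pi(y | x) up to an additive constant.
    return -0.5 * (y - rho * x) ** 2 / (1.0 - rho**2)

def mh_update_y(y, x, step=1.0):
    # Metropolis-Hastings component: random-walk proposal targeting pi(y | x).
    prop = y + step * rng.normal()
    if np.log(rng.uniform()) < log_cond_y(prop, x) - log_cond_y(y, x):
        return prop
    return y  # proposal rejected; keep the current state

def cmh_sampler(n, x0=0.0, y0=0.0):
    # One CMH scan = one Gibbs update followed by one MH update.
    x, y = x0, y0
    out = np.empty((n, 2))
    for i in range(n):
        x = gibbs_update_x(y)
        y = mh_update_y(y, x)
        out[i] = x, y
    return out

draws = cmh_sampler(5000)
print(draws.mean(axis=0))
```

The paper's conditions concern when chains of exactly this two-block form inherit geometric or uniform ergodicity; the sketch only shows the update mechanism, not the convergence analysis.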


Similar articles

Convergence of Conditional Metropolis-Hastings Samplers, with an Application to Inference for Discretely-Observed Diffusions

We consider Markov chain Monte Carlo algorithms which combine Gibbs updates with Metropolis-Hastings updates, resulting in a conditional Metropolis-Hastings sampler. We develop conditions under which this sampler will be geometrically or uniformly ergodic. We apply our results to an algorithm for drawing Bayesian inferences about the entire sample path of a diffusion process, based only upon di...


Uniform Ergodicity of the Iterated Conditional SMC and Geometric Ergodicity of Particle Gibbs samplers

We establish quantitative bounds for rates of convergence and asymptotic variances for iterated conditional sequential Monte Carlo (i-cSMC) Markov chains and associated particle Gibbs samplers [1]. Our main findings are that the essential boundedness of potential functions associated with the i-cSMC algorithm provide necessary and sufficient conditions for the uniform ergodicity of the i-cSMC M...


Adaptive Independent Metropolis-Hastings by Fast Estimation of Mixtures of Normals

Adaptive Metropolis-Hastings samplers use information obtained from previous draws to tune the proposal distribution. The tuning is carried out automatically, often repeatedly, and continues after the burn-in period. Because the resulting chain is not Markovian, adaptation needs to be done carefully to ensure convergence to the correct ergodic distribution. In this paper we distill recent theor...


A Gibbs Sampler for Learning DAGs

We propose a Gibbs sampler for structure learning in directed acyclic graph (DAG) models. The standard Markov chain Monte Carlo algorithms used for learning DAGs are random-walk Metropolis-Hastings samplers. These samplers are guaranteed to converge asymptotically but often mix slowly when exploring the large graph spaces that arise in structure learning. In each step, the sampler we propose dr...


Two convergence properties of hybrid samplers

Theoretical work on Markov chain Monte Carlo (MCMC) algorithms has so far mainly concentrated on the properties of simple algorithms such as the Gibbs sampler, or the full-dimensional Hastings-Metropolis algorithm. In practice, these simple algorithms are used as building blocks for more sophisticated methods, which we shall refer to as hybrid samplers. It is often hoped that good convergence p...



Journal:

Volume   Issue

Pages  -

Publication date: 2013